Training Feed-Forward Neural Networks with Monotonicity Requirements
Authors
Abstract
In this paper, we adapt the classical learning algorithm for feed-forward neural networks to the case where monotonicity is required in the input-output mapping. Such requirements arise, for instance, when prior knowledge of the process being observed is available. Monotonicity can be imposed by adding suitable penalization terms to the error function. The objective function, however, depends nonlinearly on the first-order derivatives of the network mapping. We show that these derivatives can easily be obtained by an extension of the standard back-propagation procedure. This yields a computationally efficient algorithm with little overhead compared to back-propagation.
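The penalty construction described above can be illustrated with a minimal sketch: for a one-hidden-layer network, the input derivatives needed by the penalty term follow from a single extra backward pass, and negative derivatives are penalized to encourage an increasing mapping. This is not the paper's exact formulation; the function names and the quadratic hinge penalty are illustrative assumptions.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """One-hidden-layer network: y = W2 @ tanh(W1 @ x + b1) + b2."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def input_jacobian(x, W1, b1, W2, b2):
    """Derivatives dy/dx via one extra backward pass:
    dy/dx = W2 @ diag(1 - h**2) @ W1, shape (n_out, n_in)."""
    h = np.tanh(W1 @ x + b1)
    return (W2 * (1.0 - h ** 2)) @ W1

def monotonicity_penalty(xs, W1, b1, W2, b2):
    """Quadratic hinge penalty on negative input derivatives,
    encouraging a monotonically increasing input-output mapping."""
    total = 0.0
    for x in xs:
        J = input_jacobian(x, W1, b1, W2, b2)
        total += np.sum(np.maximum(0.0, -J) ** 2)
    return total
```

In training, such a term would be added to the usual squared error, e.g. E = MSE + lambda * penalty, so that the penalty vanishes wherever the mapping already satisfies the monotonicity requirement.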
Similar References
Convergence of Online Gradient Method for Pi-sigma Neural Networks with Inner-penalty Terms
This paper investigates an online gradient method with inner-penalty terms for a novel feed-forward network called the pi-sigma network. This network uses product cells as its output units to indirectly incorporate the capabilities of higher-order networks while using fewer weights and processing units. Penalty-term methods have been widely used to improve the generalization performan...
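The product-cell structure mentioned in this snippet can be sketched briefly: a pi-sigma unit forms several linear "sigma" sums of the input and multiplies them in a single "pi" output cell, optionally followed by a squashing nonlinearity. This is a generic illustration, not the network of the cited paper; the tanh output activation is an assumption.

```python
import numpy as np

def pi_sigma_forward(x, W, b):
    """Pi-sigma unit: K linear summing units whose outputs are
    multiplied by a single product cell, then squashed.
    W has shape (K, n_in), b has shape (K,)."""
    sums = W @ x + b               # the K "sigma" (summing) units
    return np.tanh(np.prod(sums))  # the "pi" (product) output cell
```

Because the product of K sums expands into degree-K monomials of the inputs, the unit captures higher-order interactions with only K * n_in weights instead of one weight per monomial.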
PREDICTION OF COMPRESSIVE STRENGTH AND DURABILITY OF HIGH PERFORMANCE CONCRETE BY ARTIFICIAL NEURAL NETWORKS
Neural networks have recently been widely used to model some of the human activities in many areas of civil engineering applications. In the present paper, artificial neural networks (ANN) for predicting compressive strength of cubes and durability of concrete containing metakaolin with fly ash and silica fume with fly ash are developed at the age of 3, 7, 28, 56 and 90 days. For building these...
Learning with monotonicity requirements for optimal routing with end-to-end quality of service constraints
In this paper, we adapt the classical learning algorithm for feed-forward neural networks to the case where monotonicity is required in the input-output mapping. Monotonicity can be imposed by adding suitable penalization terms to the error function. This yields a computationally efficient algorithm with little overhead compared to back-propagation. This algorithm is used to train neural networks for dela...
Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns
The purpose of this study is to analyze the performance of the back-propagation algorithm with changing training patterns and a second momentum term in feed-forward neural networks. This analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...
Modeling of Resilient Modulus of Asphalt Concrete Containing Reclaimed Asphalt Pavement using Feed-Forward and Generalized Regression Neural Networks
Reclaimed asphalt pavement (RAP) is one of the waste materials that highway agencies promote for use in new construction or rehabilitation of highway pavements. Since the use of RAP can affect the resilient modulus and other structural properties of flexible pavement layers, this paper aims to employ two different artificial neural network (ANN) models for modeling and evaluating the effects of ...